/*******************************************************************************************************************
INSTALLATION INSTRUCTIONS - Release C2.0.7 (May 30, 2007)

This section contains the step-by-step installation instructions for creating the STORET Central Warehouse and installing the associated Web Application.  There are several noteworthy items regarding installation and batch processing:

Installation Notes:

.	Edit this script for database usernames, passwords, and connect strings.

.	Set the Step 1b variable values.  These are used by the WQX_ETL database changes in script install_wh_db_changes.sql.

.	Edit all Extract, Transform, and Load (ETL) scripts for tablespace names.

.	Scripts must be run in the order listed to ensure correct software installation.

.	A rollback segment named RBSBIG must exist and be on-line.

.	Oracle archiving should be turned off when running the ETL scripts.

.	Log files are created for each of the Structured Query Language (SQL) scripts listed in these instructions.  Each log file has a name and directory location parallel to the script that creates it.  Check the log files for errors after each step to ensure the installation completes successfully.

.	The C2.0.7 release of the Central Warehouse requires Oracle 9i (9.2.0.7) and JServer.

.	Copy the storetw directory from the "Release C2.0.7\software" directory to your C:\ drive.

.	The installation package is not dependent upon a specific drive letter.  If the package is placed on a different drive, replace "C:" with that drive letter in the following instructions.

. 	Install Oracle Spatial into the ETL database before running this ETL.

.	An optional script has been provided (i.e., tbs_coalesce.sql) to coalesce the STORETDATA and STORETINDX tablespaces for improved performance.  Execute this script if the storetw schema is being imported into another database (e.g., the ETL loaded storetw schema is moved to a different production database).  Connect to the database with Database Administrator (DBA) privileges to run the tbs_coalesce.sql script.

.	The following two objects store data request history.  Do not drop/modify them in production during the refresh cycle.
	  -- Table:     dw_data_requests
	  -- Sequence:  seq_dw_data_request_id
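
A minimal sketch of pre-installation checks for the notes above, assuming a DBA session.  The tbs_coalesce.sql statements shown are presumed typical content, not copied from that script:

```sql
-- Verify the RBSBIG rollback segment exists, and bring it online if needed:
SELECT segment_name, status FROM dba_rollback_segs WHERE segment_name = 'RBSBIG';
ALTER ROLLBACK SEGMENT rbsbig ONLINE;

-- Confirm archiving is off before running the ETL scripts (SQL*Plus command):
ARCHIVE LOG LIST

-- The optional tbs_coalesce.sql presumably issues statements of this form:
ALTER TABLESPACE storetdata COALESCE;
ALTER TABLESPACE storetindx COALESCE;
```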

Batch Processing Notes:

. 	To report on batch activity, run "dw_batch_activity_report.sql" to generate the dw_batch_activity_report.log file.  Provide report start and end dates.

.	To manually process all pending Immediate batch jobs, run "exec dw_process_pendingbatch_imm".

.	To manually process submitted batch jobs by request_id, run "exec dw_process_data_request_list('x')" where x is a comma-separated list of request ID(s).

.	Monthly Statistics (dw_monthly_stats) and Pending Request (dw_report_pending_requests) reporting, and Overnight batch processing (dw_process_batch_requests), are scheduled and run periodically without manual intervention.  They can also be run manually using the following:
	  -- To manually report statistics for a given month, run "exec dw_monthly_stats(x)" where x is a valid date for the desired month.
	  -- To manually report pending requests, log in to STORETW and run "exec dw_report_pending_requests".
	  -- To manually process batch jobs submitted for overnight, run "exec dw_process_batch_requests".

.	To change scheduled job timings, modify the appropriate variable values in dw_glob_var.sql, run dw_glob_var.sql, and then run dw_schedule_batch_jobs.sql.

.	To change online and batch processing thresholds, modify the appropriate variable values in dw_glob_var.sql, then run dw_glob_var.sql.
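
The manual batch commands above can be sketched as one SQL*Plus session; the connect string, request IDs, and date below are placeholders:

```sql
CONNECT STORETW/STORETW@STORET.ETL
exec dw_process_pendingbatch_imm
exec dw_process_data_request_list('101,102')                 -- placeholder request IDs
exec dw_monthly_stats(TO_DATE('01-MAY-2007','DD-MON-YYYY'))  -- any date in the desired month
exec dw_report_pending_requests
exec dw_process_batch_requests
```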


12/24/2006 grant execute on sys.utl_file to storetw;
12/24/2006 Set job_queue_processes to a value greater than 0 in the initialization parameter file.  This is required to run Oracle scheduled jobs; user data requests are scheduled as Oracle jobs.
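
For example, the parameter can be set dynamically on an spfile-based instance; the value 10 below is an assumption, since any value greater than 0 satisfies the requirement:

```sql
ALTER SYSTEM SET job_queue_processes = 10 SCOPE = BOTH;
-- For a pfile (init.ora) instance, instead add the line:
--   job_queue_processes = 10
```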
********************************************************************************************************************/

--1.	Open a SQL*Plus session.

--1b. Set the following variable values. These are used by WQX_ETL database changes in script install_wh_db_changes.sql.

set define on
SET VERIFY ON

define DBA_ACCOUNT=SYSTEM
define DBA_ACCOUNT_PASS=STO2ET

define WH_DB_CONNECTSTRING=STORET.ETL
--define WH_DB_CONNECTSTRING=epad9

define WH_SCHEMA=STORETW
define WH_SCHEMA_PASS=STORETW

define WH_ETL_SCHEMA=STORETW_ETL
define WH_ETL_SCHEMA_PASS=STORETW_ETL

define WH_ETL_ROLE=STORETW_ETL_ROLE
-----------------End of step 1b. ------------

--2a.	Create storetw schema with appropriate system and object privileges.
CONNECT SYSTEM/STO2ET@storet.sdc
alter session set nls_length_semantics='CHAR';
	@C:\storetw\storetw\create_storetw.sql;
CONNECT STORET/STO2ET@storet.sdc
alter session set nls_length_semantics='CHAR';
	@C:\storetw\storetw\storet_grants.sql;
CONNECT STORET1/STO2ET1@storet.sdc
alter session set nls_length_semantics='CHAR';
	@C:\storetw\storetw\storet1_grants.sql;
CONNECT SYSTEM/STO2ET@storet.sdc
alter session set nls_length_semantics='CHAR';
	revoke create session from storet, storet1;

--2b.	Update STORET.TSRUOM table.
	@C:\storetw\update_TSRUOM.sql;

--2c.	Create STORET and STORET1 indexes if they are missing.
	@C:\storetw\cr_appl_inds.sql;

--3.	Create the functions used by the ETL scripts.
CONNECT STORETW/STORETW@storet.sdc
alter session set nls_length_semantics='CHAR';
        @C:\storetw\functions\etl_glob_var.sql;
	@C:\storetw\functions\f_createflbrmk.sql;
	@C:\storetw\functions\f_estry.sql;
	@C:\storetw\functions\f_mad.sql;
	@C:\storetw\functions\f_station_visited.sql;
	@C:\storetw\functions\f_fieldset.sql;
	@C:\storetw\functions\f_createFNAICS.sql;
	@C:\storetw\functions\f_biopt.sql;
	@C:\storetw\functions\f_prmvl.sql;
	@c:\storetw\functions\f_char_name.sql;

--4.	Create FA_STATION table.  
	@C:\storetw\fa_statn\fa_station_table.sql;

--5a.	Create Characteristic tables.
	@C:\storetw\char_stn\char_stn_table.sql;
	@C:\storetw\char_stn\char_table.sql;
	@C:\storetw\char_stn\lu_chartype_tables.sql;

--5b.	Create LU_CHAR_ALIAS AND LU_CHAR_ALIAS_TYPE tables.
	@C:\storetw\lu_char_alias\lu_char_alias.sql;

--6.	Create Drainage Basin tables.
	@C:\storetw\drainagebasin\db_tables.sql;

--7.	Create Geographical tables (state/county).
	@C:\storetw\geopa\geo_tables.sql;

--8.	Create Method and Datum (MAD) code tables.
	@C:\storetw\mad\mad_tables.sql;
        @C:\storetw\mad\stn_std_datum.sql;

--9.	Create Organization table.
	@C:\storetw\org\org_tables.sql;

--10.	Create Station Types table.
	@C:\storetw\statn_types\statn_types_tables.sql;

--11.	Create Estuary tables.
	@C:\storetw\estuary\estry_tables.sql;

--12.	Create DI_DATE table.
	@C:\storetw\di_date\di_date_table.sql;

--13.	Create UOM table with Unit Conversion factors.
	@C:\storetw\unit_conversion.sql;

--14.	Create FA_REGULAR_RESULT table. 
	@C:\storetw\fa_regular_result\fa_regular_result_table.sql;
	@C:\storetw\fa_regular_result\fa_regular_result_data1.sql;
	@C:\storetw\fa_regular_result\fa_regular_result_pdl_data1.sql;
	@C:\storetw\fa_regular_result\create_temp_tables.sql;
 	@C:\storetw\fa_regular_result\update_temp_tables.sql;
	@C:\storetw\fa_regular_result\fa_regular_result_data2.sql;

--15.   Create FA_BIOLOGICAL_RESULT table.
        @C:\storetw\fa_biological_result\fa_bio_result_table.sql;
	@C:\storetw\fa_biological_result\fa_bio_result_data1.sql;
	@C:\storetw\fa_biological_result\create_bio_temp_tables.sql;
	@C:\storetw\fa_biological_result\update_bio_temp_tables.sql;
	@C:\storetw\fa_biological_result\update_bio_temp_tables2.sql;
	@C:\storetw\fa_biological_result\fa_bio_result_data2.sql;

--16.   Create FA_HABITAT_RESULT table.
        @C:\storetw\fa_habitat_result\fa_habitat_result_table.sql;
	@C:\storetw\fa_habitat_result\fa_habitat_result_data1.sql;
	@C:\storetw\fa_habitat_result\create_habitat_temp_tables.sql;
	@C:\storetw\fa_habitat_result\update_habitat_temp_tables.sql;
	@C:\storetw\fa_habitat_result\fa_habitat_result_data2.sql;

--17.	Create DI_ACT_MEDIUM table.
	@C:\storetw\di_act_medium\di_act_medium.sql;

--18.	Create result fact table indexes.
	@C:\storetw\fa_regular_result\fa_regular_result_index.sql;
	@C:\storetw\fa_biological_result\fa_bio_result_index.sql;
	@C:\storetw\fa_habitat_result\fa_habitat_result_index.sql;

--19.	Create LU_CHAR_ALIAS AND LU_CHAR_ALIAS_TYPE tables.
	--@C:\storetw\lu_char_alias\lu_char_alias.sql;

--20.	Create DI_PROJECT table and the associated relationship tables.
	@C:\storetw\di_project\di_project_table.sql;
	@C:\storetw\di_project\project_rel_tables.sql;

--21.	Create the LU_STATION_ALIAS table.
	@C:\storetw\lu_station_alias\lu_station_alias.sql;

--22.   Create the remaining dimension tables (i.e., DI_ACTIVITY_INTENT, DI_COMMUNITY_SAMPLED, DI_SUBJECT_TAXON, DI_GROUP_TYPE, DI_BIOPART, and DI_ACTIVITY_MATRIX).
        @C:\storetw\di_activity_intent\di_activity_intent.sql;
	@C:\storetw\di_community_sampled\di_community_sampled.sql;
	@C:\storetw\di_subject_taxon\di_subject_taxon.sql;
	@C:\storetw\di_group_type\di_group_type.sql;
	@C:\storetw\di_biopart\di_biopart.sql;
	@C:\storetw\di_activity_matrix\di_activity_matrix.sql;

--23.	Create the BLOB table and add BLOB IDs to the fact tables.
	@C:\storetw\fa_blob\fa_blob.sql;

--24.	Finish building the FA_REGULAR_RESULT table.
	@C:\storetw\fa_regular_result\fa_regular_result_const.sql;
	@C:\storetw\fa_regular_result\fa_regular_result_drop_cols.sql;

--25.	Finish building the FA_BIOLOGICAL_RESULT table.
	@C:\storetw\fa_biological_result\fa_bio_result_const.sql;
	@C:\storetw\fa_biological_result\fa_bio_result_drop_cols.sql;

--26.	Finish building the FA_HABITAT_RESULT table.
	@C:\storetw\fa_habitat_result\fa_habitat_result_const.sql;
	@C:\storetw\fa_habitat_result\fa_habitat_result_drop_cols.sql;

--26.5 Create metadata tables 
	@c:\storetw\metadata_tables\org_phy_addr.sql
	@c:\storetw\metadata_tables\org_elec_addr.sql
	@c:\storetw\metadata_tables\co_org_table.sql
	@c:\storetw\metadata_tables\co_op_addr.sql
	@c:\storetw\metadata_tables\co_op_eaddr.sql
	@c:\storetw\metadata_tables\lab.sql
	@c:\storetw\metadata_tables\lab_addr.sql
	@c:\storetw\metadata_tables\lab_eaddr.sql
	@c:\storetw\metadata_tables\lab_analytical_proc.sql
	@c:\storetw\metadata_tables\lab_sample_prep_proc.sql
	@c:\storetw\metadata_tables\program_table.sql
	@c:\storetw\metadata_tables\sample_collection_proc.sql
	@c:\storetw\metadata_tables\sample_gear.sql
	@c:\storetw\metadata_tables\sample_preservation.sql
	@c:\storetw\metadata_tables\citation_table.sql

--27.	Drop unnecessary columns from FA_STATION table.
	@C:\storetw\fa_statn\fa_station_drop_cols.sql;

--28.	Analyze database objects and performance tune.
	@C:\storetw\analyze_objects.sql;
	@C:\storetw\performance_tuning.sql;

--29.	Delete temporary tables.
	@C:\storetw\drop_temp_tables.sql;

--30.	Update User Defined Habitat Result columns.
	@C:\storetw\fa_habitat_result\DW_user_def_hab.sql;

--31. Update tables and indexes storage parameters.
	@C:\storetw\gen_tblstrg.sql;
	@C:\storetw\gen_idxstrg.sql;

--The ETL process for generating the Data Warehouse is complete.

--32.	Create report customization related tables, other application tables, materialized views and table/column comments.
CONNECT STORETW/STORETW@storet.sdc
alter session set nls_length_semantics='CHAR';
	@C:\storetw\application\APP_COLUMN_NAME.sql;
	@C:\storetw\application\APP_COLUMN_NAME_UPD.sql;	
	@C:\storetw\application\DW_data_requests_table.sql;
	@C:\storetw\materialized_views.sql;

	@C:\storetw\column_comments.sql;
	@C:\storetw\table_comments.sql;

	-- Watershed summary related web services and UI tables  including station_char refresh.
	@C:\storetw\WQX_ETL\MT_char_chartype.sql
	@C:\storetw\web_services\refresh_station_char.sql
--	@C:\storetw\web_services\create_ws_station_data.sql
	@C:\storetw\web_services\create_station_summary_view.sql
	@C:\storetw\web_services\create_ws_huc_org_summary.sql
	@C:\storetw\web_services\create_ws_huc_char_summary.sql
	@C:\storetw\web_services\create_ws_huc_org_char_summary.sql
	@C:\storetw\web_services\create_ws_station_summary.sql
	@C:\storetw\web_services\alter_station_project.sql
	@C:\storetw\web_services\ws_station_view.sql
	@C:\storetw\web_services\insert_web_methods_data.sql

	--@C:\storetw\web_services\ws_synonyms.sql

	-- apply WQX ETL database changes
	@C:\storetw\WQX_ETL\install_wh_db_changes.sql

CONNECT STORETW/STORETW@storet.sdc

	-- analyze application objects
	@C:\storetw\analyze_appl_objects.sql;

--The ETL process is now complete.  Perform the remaining steps for installing the web interface.

--33.	Alter the file C:\storetw\application\DW_glob_var.sql to reflect the directory structure of the environment where the software is being installed.  The global variables established in this package should have values identical to the values in the glob_var.sql file included with the existing STORET Web Application, with one exception: the file name portion of the variable lv_script_path should be set to "DW_storet.js".  Also update the threshold values for batch processing and national projects, and provide timings for scheduled jobs, the STORET email address, and the SMTP host and port.
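
As a hypothetical excerpt of the edited package: only lv_script_path is named in these instructions, and the other identifiers, paths, and values below are illustrative placeholders:

```sql
-- Hypothetical excerpt of DW_glob_var.sql; only lv_script_path appears in
-- these instructions.  The other names, paths, and values are placeholders.
lv_script_path  VARCHAR2(200) := '/scripts/DW_storet.js';
lv_smtp_host    VARCHAR2(100) := 'smtp.example.gov';   -- placeholder
lv_smtp_port    NUMBER        := 25;                   -- placeholder
```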

--34.	Create the global variables package as storetw.
	@C:\storetw\application\DW_glob_var.sql;

--35.	Compile Java procedures used by the web application procedures.
	@C:\storetw\application\procedures\DW_exec_os.java;
	@C:\storetw\application\procedures\DW_blob_handler.java;

--36.	Create the web application procedures as storetw.
	@C:\storetw\application\procedures\DW_mail_pkg.sql;
	@C:\storetw\application\procedures\DW_record_data_request.sql;
	@C:\storetw\application\procedures\DW_huc_popup.sql;
	@C:\storetw\application\procedures\DW_counties_popup.sql;
	@C:\storetw\application\procedures\DW_top_of_page.sql;
	@C:\storetw\application\procedures\DW_bottom_of_page.sql;
	@C:\storetw\application\procedures\DW_display_calendar.sql;
	@C:\storetw\application\procedures\DW_geo_select.sql;
	@C:\storetw\application\procedures\DW_date_select.sql;
	@C:\storetw\application\procedures\DW_char_select.sql;
	@C:\storetw\application\procedures\DW_project_select.sql;
	@C:\storetw\application\procedures\DW_station_select.sql;
	@C:\storetw\application\procedures\DW_medium_select.sql;
	@C:\storetw\application\procedures\DW_community_select.sql;
	@C:\storetw\application\procedures\DW_selection_criteria.sql;
	@C:\storetw\application\procedures\DW_selection_criteria_station.sql;
	@C:\storetw\application\procedures\DW_station_count.sql;
	@C:\storetw\application\procedures\DW_station_download_custom.sql;
	@C:\storetw\application\procedures\DW_station_hub_custom.sql;
	@C:\storetw\application\procedures\DW_proj_popup.sql;
	@C:\storetw\application\procedures\DW_char_alias_popup.sql;
	@C:\storetw\application\procedures\DW_station_popup.sql;
	@C:\storetw\application\procedures\DW_extref_popup.sql;
	@C:\storetw\application\procedures\DW_result_criteria_geo.sql;
	@C:\storetw\application\procedures\DW_result_criteria_project.sql;
	@C:\storetw\application\procedures\DW_result_criteria_station.sql;
	@C:\storetw\application\procedures\DW_bio_result_criteria_geo.sql;
	@C:\storetw\application\procedures\DW_bio_result_criteria_project.sql;
	@C:\storetw\application\procedures\DW_bio_result_criteria_station.sql;
	@C:\storetw\application\procedures\DW_hab_result_criteria_geo.sql;
	@C:\storetw\application\procedures\DW_hab_result_criteria_project.sql;
	@C:\storetw\application\procedures\DW_hab_result_criteria_station.sql;
	@C:\storetw\application\procedures\DW_result_count.sql;
	@C:\storetw\application\procedures\DW_result_download_custom.sql;
	-- metadata procedure
	@C:\storetw\application\procedures\dw_project_blob_md.sql;
	@C:\storetw\application\procedures\DW_result_metadata.sql;
        --

	@C:\storetw\application\procedures\DW_result_hub_custom.sql;
	@C:\storetw\application\procedures\DW_home.sql;
	@C:\storetw\application\procedures\DW_blob_download.sql;
	@C:\storetw\application\procedures\DW_station_download_zip.sql;
	@C:\storetw\application\procedures\DW_result_download_zip.sql;
        -- batch related scripts.
	@C:\storetw\application\procedures\DW_process_data_request.sql;
	@C:\storetw\application\procedures\DW_process_batch_requests.sql;
	@C:\storetw\application\procedures\DW_report_pending_requests.sql;
	@C:\storetw\application\procedures\DW_process_data_request_list.sql;
	@C:\storetw\application\procedures\DW_batch_download_pickup.sql;
	@C:\storetw\application\procedures\DW_process_pendingbatch_imm.sql;
	@C:\storetw\application\procedures\DW_monthly_stats.sql;
	@C:\storetw\application\DW_schedule_batch_jobs.sql;
	-- metadata procedure
	--@C:\storetw\application\procedures\dw_project_blob_md.sql;
	--@C:\storetw\application\procedures\DW_result_metadata.sql;

--37.	Copy the application's JavaScript library file into the scripts directory indicated in the global variables script (see step 33).
	--C:\storetw\application\scripts\DW_storet.js

--38.	Copy the application's Help and the associated image files into the Hyper-Text Markup Language (HTML) documents directory indicated in the global variables script (see step 33).
	--C:\storetw\application\doc\DW_storet_help.html
	--C:\storetw\application\doc\lat_long_ex.gif

--39.	Alter the file C:\storetw\application\DW_PERMISSIONS.sql to reflect the environment where the software is being installed.  This script includes Java permissions that require a username/password with DBA privileges.  The script must also be modified to include the correct directory locations of the gzip (file compression), cp, and tar utilities, and the location of the user reports directory (established in DW_glob_var).
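
The Java permission entries in DW_PERMISSIONS.sql are presumably of the following form; the grantee and utility path below are placeholders for the actual values in your environment:

```sql
BEGIN
  DBMS_JAVA.GRANT_PERMISSION(
    grantee           => 'STORETW',                    -- placeholder grantee
    permission_type   => 'SYS:java.io.FilePermission',
    permission_name   => '/usr/bin/gzip',              -- placeholder utility path
    permission_action => 'execute');
END;
/
```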

--40.	Grant appropriate privileges and create required synonyms.
	@C:\storetw\application\DW_GRANTS.sql;
CONNECT STORETWEB/STORETWEB@storet.sdc
alter session set nls_length_semantics='CHAR';
	@C:\storetw\application\DW_SYNONYMS.sql;
	@C:\storetw\web_services\ws_synonyms.sql

CONNECT SYSTEM/STO2ET@storet.sdc
alter session set nls_length_semantics='CHAR';
	@C:\storetw\application\DW_PERMISSIONS.sql;
/*
CONNECT STORETW/STORETW@STORET.SDC			
alter session set nls_length_semantics='CHAR';	
        @C:\storetw\dw_statistics.sql;
CONNECT STORUSER/STORUSER@STORET.SDC
alter session set nls_length_semantics='CHAR';
        @C:\storetw\org_summ_stormod.sql;
*/
--  Deploy stationcatalogservice_ear.ear, watershedsummaryservice_ear.ear and watershedsummaryUI_ear.ear onto Oracle Application Server 10g Release 2 based on instructions in the Version Description Document.


/*

Note:  To delete all files that match the pattern 'Data_*.txt' or 'RefDoc_*.*' and that are >= 24 hours old,
add the following commands to the download directory cleanup cron job's shell script:

NOTE: Two find statements are needed per pattern: "-mtime +1" matches files more than two 24-hour periods old, and "-mtime 1" matches files between 24 and 48 hours old.

find /public/data/storpub1/web/modern/downloads -name 'Data_*.txt' -mtime +1 -exec rm -f {} \; 
find /public/data/storpub1/web/modern/downloads -name 'Data_*.txt' -mtime 1 -exec rm -f {} \;


find /public/data/storpub1/web/modern/downloads -name 'RefDoc_*.*' -mtime +1 -exec rm -f {} \; 
find /public/data/storpub1/web/modern/downloads -name 'RefDoc_*.*' -mtime 1 -exec rm -f {} \;

Note:  To change file permissions for all files less than 24 hours old:
find /public/data/storpub1/web/modern/downloads -name '*.*' -mtime -1 -exec chmod 644 {} \;

Note:  storetw.sp_sequence_synch synchronizes each sequence object's NEXTVAL with the maximum value in the related table column.  Run it manually as necessary, or schedule it using DBMS_JOB to run a few hours before the WQX_ETL runs.
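
A scheduling sketch using DBMS_JOB; the daily 2:00 AM interval is an assumption, so pick a time a few hours before the WQX_ETL window:

```sql
DECLARE
  v_job BINARY_INTEGER;
BEGIN
  DBMS_JOB.SUBMIT(
    job       => v_job,
    what      => 'storetw.sp_sequence_synch;',
    next_date => TRUNC(SYSDATE) + 1 + 2/24,    -- tomorrow at 02:00
    interval  => 'TRUNC(SYSDATE) + 1 + 2/24'); -- every day at 02:00
  COMMIT;
END;
/
```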


*/

DISCONNECT
--EXIT